    Multiscale Network Analysis for Financial Contagion

    Contagion in financial markets has been one of the most active areas of research, especially during the last decade, owing to the major incidents of the Global Financial Crisis and the European Financial Crisis. However, two of the most important questions that remain after a financial crisis are what the determinants of the crisis were and how an incident can be forecast using suitable indicators. The purpose of this study is twofold: first, to develop a measure of contagion based on the multiscale nature of financial contagion; second, to examine how financial contagion spreads through the US economy at different frequencies based on the proposed measure. We assert that important information on an upcoming crisis, not observable in the original data, may be revealed by performing a time-frequency analysis of the time series and the cross-section of stock returns. We use wavelet analysis to decompose the returns and network analysis to compute various network characteristics related to contagion. Our proposed methodology allows us to understand the short-, mid- and long-term connections of the network; bring out structures and relations that are not initially visible and that mask the true connections between companies; study how the network measures change over scale; and, finally, examine the distribution of contagion at different time horizons and scales.
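
    The decompose-then-connect pipeline described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a one-level Haar transform stands in for a full wavelet decomposition, the return series are made up, and the 0.5 correlation cutoff for linking two companies is an arbitrary choice for the sketch.

```python
# Minimal sketch: decompose return series with a one-level Haar
# transform, then build a correlation-threshold network per scale.
# Data and the 0.5 correlation cutoff are illustrative assumptions.

def haar_level1(x):
    """Split a series into approximation (low-freq) and detail (high-freq)."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    return a, d

def corr(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def threshold_network(series, cut=0.5):
    """Adjacency matrix: connect assets whose |correlation| exceeds the cutoff."""
    n = len(series)
    return [[1 if i != j and abs(corr(series[i], series[j])) > cut else 0
             for j in range(n)] for i in range(n)]

returns = [
    [0.01, -0.02, 0.015, -0.01, 0.02, -0.005],
    [0.012, -0.018, 0.013, -0.012, 0.019, -0.004],
    [-0.01, 0.02, -0.015, 0.01, -0.02, 0.005],
]
details = [haar_level1(r)[1] for r in returns]   # high-frequency scale
adj = threshold_network(details)
degrees = [sum(row) for row in adj]              # a basic contagion proxy
print(degrees)
```

    Repeating the same construction on the approximation coefficients (and on deeper decomposition levels) gives the network at progressively longer horizons, which is where scale-dependent contagion measures come from.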

    A Comparison between Wavelet Networks and Genetic Programming in the Context of Temperature Derivatives

    The purpose of this study is to develop a model that accurately describes the dynamics of the daily average temperature in the context of weather derivatives pricing. More precisely, we compare two state-of-the-art machine learning algorithms, namely wavelet networks and genetic programming, against the classic linear approaches widely used in the pricing of temperature derivatives in the financial weather market, as well as against various machine learning benchmark models such as neural networks, radial basis functions and support vector regression. The accuracy of the valuation process depends on the accuracy of the temperature forecasts. Our proposed models are evaluated and compared in-sample and out-of-sample in various locations where weather derivatives are traded. Furthermore, we expand our analysis by examining the stability of the forecasting models relative to the forecasting horizon. Our findings suggest that the proposed nonlinear methods significantly outperform the alternative linear models, with wavelet networks ranking first, and can be used for accurate weather derivative pricing in the weather market.
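
    As a reference point, the classic linear approaches mentioned above typically model daily average temperature as a deterministic seasonal cycle plus a mean-reverting autoregressive term. A minimal, self-contained sketch of that benchmark (all coefficients are made up for illustration, not fitted to any station's data):

```python
import math

# Toy linear benchmark for daily average temperature:
# seasonal sinusoid plus an AR(1) term on the deseasonalised residual.
# All coefficients below are illustrative assumptions, not fitted values.

def seasonal_mean(t, a=10.0, b=8.0, phase=-80.0):
    """Deterministic yearly cycle (degrees Celsius), t in days."""
    return a + b * math.sin(2 * math.pi * (t - phase) / 365.25)

def forecast(temps, t0, horizon, alpha=0.8):
    """Iterated AR(1) forecast on residuals from the seasonal mean."""
    resid = temps[-1] - seasonal_mean(t0)
    out = []
    for h in range(1, horizon + 1):
        resid *= alpha                       # mean reversion toward the cycle
        out.append(seasonal_mean(t0 + h) + resid)
    return out

history = [seasonal_mean(t) + 2.0 for t in range(100)]  # warm anomaly of +2C
path = forecast(history, 99, 5)
# the +2C anomaly decays geometrically toward the seasonal mean
print([round(p - seasonal_mean(100 + h), 3) for h, p in enumerate(path)])
```

    The nonlinear models in the study (wavelet networks, genetic programming) aim to capture whatever structure this linear residual dynamic misses.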

    Refugee Education in Greece: Integration or Segregation?

    The closure of the "Balkan route" in the spring of 2016 left nearly 21,000 children stranded in Greece. Although education policies were designed to integrate these children into the Greek education system, in practice the policies resulted in the segregation of some students.

    Denoising the Equity Premium

    Previous studies have shown that a variety of economic variables fail to deliver consistently accurate out-of-sample forecasts of the equity premium. In this study we propose a wavelet denoising framework in the context of equity premium forecasting. First, we decompose the time series using wavelet analysis, and then we remove the noise at different frequencies. Our results show that the proposed method improves the forecasting ability of linear models, indicating that wavelet denoising can successfully identify the underlying persistent signal in the equity premium time series. Motivated by this, we apply various linear and nonlinear models, such as wavelet networks and neural networks, to further improve the accuracy of our forecasts.
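
    The decompose-threshold-reconstruct step can be illustrated with a one-level Haar transform and soft thresholding of the detail (high-frequency) coefficients. This is a generic wavelet-denoising sketch, not the paper's exact wavelet or threshold choice; the signal and the 0.05 threshold are made up.

```python
# Generic wavelet-denoising sketch: one-level Haar decomposition,
# soft-threshold the detail coefficients, then invert the transform.
# The input series and the threshold value are illustrative assumptions.

def haar_forward(x):
    """One-level Haar transform of an even-length series."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
    return a, d

def haar_inverse(a, d):
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def soft(c, t):
    """Soft thresholding: shrink toward zero, kill small coefficients."""
    return (abs(c) - t) * (1 if c > 0 else -1) if abs(c) > t else 0.0

def denoise(x, t=0.05):
    a, d = haar_forward(x)
    return haar_inverse(a, [soft(c, t) for c in d])

signal = [0.5, 0.5, 0.5, 0.5, -0.5, -0.5]        # slow "persistent" component
noise  = [0.04, -0.04, 0.03, -0.03, 0.02, -0.02]  # high-frequency disturbance
clean  = denoise([s + n for s, n in zip(signal, noise)])
print([round(c, 3) for c in clean])
```

    Because the alternating noise lives entirely in the detail coefficients, thresholding them recovers the persistent component exactly in this toy case; on real equity premium data the split is of course imperfect.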

    Automatic Mass Valuation for Non-Homogeneous Housing Markets

    In recent years, big financial institutions have become interested in creating and maintaining property valuation models. The main objective is to use reliable historical data to forecast the price of a new property in a comprehensive manner and to provide some indication of the uncertainty around this forecast. The need for unbiased, objective, systematic assessment of real property has always been important. This need is especially urgent now, as banks need assurance that they have appraised a property at a fair value before issuing a loan, and governments need to know the fair market value of a property in order to set the annual property tax accordingly. In this study we compare various linear, nonlinear and machine learning approaches. We apply a large set of variables, supported by the literature, describing the characteristics of the real estate properties, as well as transformations of these variables. The final set consists of 60 variables. We address the question of variable selection by extracting all available information with the use of several shrinkage methods, machine learning techniques, dimensionality reduction techniques and combination forecasts. The forecasting ability of each method is evaluated out-of-sample on a set of over 30,000 real estate properties from the Greek housing market, which is both inefficient and non-homogeneous. Special care is given to measuring the success of the forecasts, but also to identifying the property characteristics that lead to large forecasting errors. Finally, by examining the strengths and the performance of each method, we apply a combined forecasting rule to improve forecasting accuracy.
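
    A combined forecasting rule, in its simplest form, just averages the individual methods' predictions per property. The sketch below (with made-up prices, not the study's data or weighting scheme) shows why this can help: when two models' errors partly offset, the equal-weight average has a lower mean absolute error than either model alone.

```python
# Equal-weight forecast combination sketch (illustrative numbers only):
# average the predictions of two valuation models per property and
# compare mean absolute errors.

def mae(pred, actual):
    return sum(abs(p - a) for p, a in zip(pred, actual)) / len(actual)

actual   = [100.0, 150.0, 200.0, 250.0]      # "true" prices (thousands)
model_a  = [110.0, 140.0, 210.0, 240.0]      # errors alternate in sign
model_b  = [ 95.0, 158.0, 195.0, 258.0]      # errors alternate the other way
combined = [(a + b) / 2 for a, b in zip(model_a, model_b)]

print(mae(model_a, actual), mae(model_b, actual), mae(combined, actual))
```

    More elaborate rules weight the models by past out-of-sample performance, but the offsetting-error mechanism is the same.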

    Global financial crisis and multiscale systematic risk: Evidence from selected European stock markets

    In this paper we investigate the impact of the global financial crisis on the multi-horizon nature of systematic risk and market risk, using daily data from eight major European equity markets over the period 2005-2018. The method is based on a wavelet multiscale approach within the framework of the capital asset pricing model. Empirical results demonstrate that beta coefficients have a multiscale tendency, and betas tend to increase at higher scales (lower frequencies). In addition, the size of the betas and the R²s tends to increase during the crisis period compared with the pre-crisis period. The multiscale nature of the betas is consistent with the fact that stock market investors have different time horizons due to different trading strategies. Our results based on scale-dependent value-at-risk (VaR) suggest that market risk tends to be more concentrated at lower time scales (higher frequencies) of the data. Moreover, the scale-by-scale estimates of VaR increased almost threefold for every market during the crisis period compared with the pre-crisis period. Finally, our approach allows for accurately forecasting time-dependent betas and VaR.
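
    The scale-dependent beta in a wavelet CAPM is the usual covariance ratio computed on wavelet coefficients at each scale, and the scale-dependent VaR can be read off the dispersion of those same coefficients. A minimal sketch (a one-level Haar detail stands in for a full multiresolution decomposition, a Gaussian quantile is assumed for the VaR, and the return series are made up):

```python
# Sketch: beta and Gaussian VaR at one wavelet scale.
# beta_j = Cov(d_j(stock), d_j(market)) / Var(d_j(market));
# VaR_j  = z * std(d_j(stock)) for a chosen confidence level.
# A one-level Haar detail stands in for scale j; data are illustrative.

def haar_detail(x):
    return [(x[i] - x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]

def cov(x, y):
    """Population covariance (the divisor cancels in the beta ratio)."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)

market = [0.010, -0.008, 0.012, -0.010, 0.006, -0.004]
stock  = [0.015, -0.013, 0.020, -0.014, 0.010, -0.008]

dm, ds = haar_detail(market), haar_detail(stock)
beta_scale1 = cov(ds, dm) / cov(dm, dm)
var95 = 1.645 * cov(ds, ds) ** 0.5     # 95% Gaussian VaR at this scale
print(round(beta_scale1, 3), round(var95, 4))
```

    Repeating this on deeper decomposition levels yields the beta and VaR profiles across scales that the paper compares between the pre-crisis and crisis periods.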

    Real Estate valuation and forecasting in non-homogeneous markets: A case study in Greece during the financial crisis

    In this paper we develop an automatic valuation model for property valuation using a large database of historical prices from Greece. The Greek property market is an inefficient, non-homogeneous market, still in its infancy and characterised by a lack of information. As a result, modelling the Greek real estate market is a very interesting and challenging problem. The available data cover a broad range of properties across time and include the Greek financial crisis period, which led to tremendous changes in the dynamics of the real estate market. We formulate and compare linear and nonlinear models based on regression, hedonic equations, spatial analysis and artificial neural networks. The forecasting ability of each method is evaluated out-of-sample. Special care is given to measuring the success of the forecasts, but also to identifying the property characteristics that lead to large forecasting errors. Finally, by examining the strengths and the performance of each method, we apply a combined forecasting rule to improve performance. Our results indicate that the proposed methodology constitutes an accurate tool for property valuation in non-homogeneous, newly developed markets.
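
    A hedonic equation, one of the model families compared above, relates price to property characteristics. The one-regressor sketch below fits log-price against floor area with closed-form simple OLS; the four observations are invented for illustration, not Greek market data.

```python
import math

# Minimal hedonic-equation sketch: log(price) = b0 + b1 * size,
# fitted by closed-form simple OLS. Data are illustrative only.

def simple_ols(x, y):
    """Return (intercept, slope) of the least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    return my - b1 * mx, b1

sizes  = [50.0, 75.0, 100.0, 125.0]              # square metres
prices = [80_000, 115_000, 165_000, 235_000]     # euros
b0, b1 = simple_ols(sizes, [math.log(p) for p in prices])

predicted = math.exp(b0 + b1 * 90.0)             # value a 90 m2 property
print(round(b1, 4), round(predicted))
```

    Real hedonic models add many more characteristics (location, age, condition), which is where the variable-selection and nonlinear methods of the study come in.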

    Equity premium prediction: The role of information from the options market

    This paper examines the role of information from the options market in forecasting the equity premium. We provide empirical evidence that the equity premium is predictable out-of-sample using a set of CBOE strategy benchmark indices as predictors. We use a range of econometric approaches to generate point, quantile and density forecasts of the equity premium, and we find that models based on option variables consistently outperform the historical average benchmark. In addition to statistical gains, using option predictors results in substantial economic benefits for a mean-variance investor, delivering up to a fivefold increase in certainty equivalent returns over the benchmark during the 1996-2021 sample period.
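
    The certainty equivalent return (CER) used to quantify the economic gains is, for a mean-variance investor with risk aversion γ, CER = μ − (γ/2)σ². A minimal sketch with illustrative numbers (γ = 3 and both return series are made up, not the paper's strategies):

```python
# Certainty equivalent return for a mean-variance investor:
# CER = mean portfolio return - (gamma / 2) * variance.
# gamma = 3 and the two return series below are illustrative choices.

def cer(returns, gamma=3.0):
    n = len(returns)
    mu = sum(returns) / n
    var = sum((r - mu) ** 2 for r in returns) / n
    return mu - 0.5 * gamma * var

benchmark    = [0.004, -0.002, 0.005, -0.001, 0.003]  # historical-average strategy
option_based = [0.006,  0.001, 0.007,  0.000, 0.005]  # option-predictor strategy

gain = cer(option_based) - cer(benchmark)             # CER gain per period
print(round(gain, 5))
```

    Comparing CERs rather than raw means is what makes the gain "economic": a strategy only scores higher if its extra return survives the penalty for the risk it takes.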

    Feature extraction and identification techniques for the alignment of perturbation simulations with power plant measurements

    In this work, a methodology is proposed for comparing measured and simulated neutron noise signals in nuclear power plants, with the simulation sets generated by the CORE SIM+ diffusion-based reactor noise simulator. More specifically, the method relies on computing the cross-power spectral density of the detector signals and subsequently comparing it with the simulated counterparts at specific frequency values corresponding to the signals' high energy content. The simulated perturbations considered are (i) axially travelling perturbations, (ii) fuel assembly vibrations, (iii) core barrel vibrations, and (iv) generic "absorber of variable strength" types. The reactor core used for the current study is a German 4-loop pre-Konvoi Pressurized Water Reactor.
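
    The core comparison step, computing the cross-power spectral density (CPSD) of two detector signals and locating the frequency with the highest energy content, can be sketched with a plain DFT. Real measurement pipelines would use Welch-style averaging and windowing; this is a bare illustration, and the 1 Hz test tone, 16 Hz sampling rate and 64-sample window are all arbitrary choices for the sketch.

```python
import cmath
import math

# Bare CPSD sketch: DFT both signals, multiply one spectrum by the
# conjugate of the other, and find the peak-magnitude frequency bin.
# The test tone and window length are illustrative assumptions.

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def cpsd(x, y):
    """Cross spectrum: X(f) * conj(Y(f)) per frequency bin."""
    return [a * b.conjugate() for a, b in zip(dft(x), dft(y))]

n, fs = 64, 16.0                      # samples, sampling rate (Hz)
tone = 1.0                            # shared 1 Hz component
x = [math.sin(2 * math.pi * tone * t / fs) for t in range(n)]
y = [0.5 * math.sin(2 * math.pi * tone * t / fs + 0.3) for t in range(n)]

half = cpsd(x, y)[:n // 2]            # keep non-negative frequencies
peak_bin = max(range(1, len(half)), key=lambda k: abs(half[k]))
print(peak_bin * fs / n)              # peak frequency in Hz
```

    The phase of the peak CPSD bin additionally encodes the relative delay between the two detectors, which is what makes the cross spectrum (rather than two separate power spectra) useful for localising a perturbation.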

    Machine learning based prediction of soil total nitrogen, organic carbon and moisture content by using VIS-NIR spectroscopy

    It is widely known that visible and near-infrared (VIS-NIR) spectroscopy has the potential to estimate soil total nitrogen (TN), organic carbon (OC) and moisture content (MC), due to the direct spectral responses these properties have in the near-infrared (NIR) region. However, improving the prediction accuracy requires advanced modelling techniques, particularly when measurement is planned for fresh (wet and unprocessed) soil samples. The aim of this work is to compare the predictive performance of two linear multivariate and two machine learning methods for TN, OC and MC. The two multivariate methods investigated were principal component regression (PCR) and partial least squares regression (PLSR), whereas the machine learning methods were least squares support vector machines (LS-SVM) and Cubist. A mobile, fibre-type VIS-NIR spectrophotometer was used to collect soil spectra (305–2200 nm) in diffuse reflectance mode from 140 wet soil samples collected from one field in Germany. The results indicate that the machine learning methods are capable of tackling non-linear problems in the dataset. LS-SVM and the Cubist method outperformed the linear multivariate methods for the prediction of all three soil properties studied. LS-SVM provided the best prediction for MC (root mean square error of prediction (RMSEP) = 0.457% and residual prediction deviation (RPD) = 2.24) and OC (RMSEP = 0.062% and RPD = 2.20), whereas the Cubist method provided the best prediction for TN (RMSEP = 0.071 and RPD = 1.96).
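
    The two accuracy measures reported, RMSEP and the residual prediction deviation (RPD, the ratio of the standard deviation of the reference values to the RMSEP), can be reproduced as follows. The reference and predicted values are illustrative, not the paper's data.

```python
# RMSEP and RPD sketch: RPD = SD(reference values) / RMSEP.
# A higher RPD means predictions resolve more of the natural variation
# in the property. The values below are illustrative assumptions.

def rmsep(pred, ref):
    return (sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref)) ** 0.5

def sd(x):
    """Sample standard deviation (n - 1 divisor)."""
    m = sum(x) / len(x)
    return (sum((v - m) ** 2 for v in x) / (len(x) - 1)) ** 0.5

reference = [0.10, 0.14, 0.18, 0.22, 0.26]   # e.g. lab-measured TN (%)
predicted = [0.11, 0.13, 0.19, 0.21, 0.27]   # e.g. spectral model output

e = rmsep(predicted, reference)
rpd = sd(reference) / e
print(round(e, 3), round(rpd, 2))
```

    Because RPD normalises the error by the spread of the reference data, it lets models be compared across properties measured in different units, which is why the paper reports it alongside RMSEP.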